
    The clinical application of PET/CT: a contemporary review

    The combination of positron emission tomography (PET) scanners and x-ray computed tomography (CT) scanners into a single PET/CT scanner has resulted in vast improvements in the diagnosis of disease, particularly in the field of oncology. A decade on from the publication of the details of the first PET/CT scanner, we review the technology and applications of the modality. We examine the design aspects of combining two different imaging types into a single scanner, and the artefacts produced, such as attenuation-correction, motion and CT truncation artefacts. The article also provides a discussion and literature review of the applications of PET/CT to date, covering detection of tumours, radiotherapy treatment planning, patient management, and applications outside the field of oncology.

    Why publish? [Editorial]

    Anybody who has attempted to publish some aspect of their work in an academic journal will know that it isn’t as easy as it may seem. The amount of preparation required of a manuscript can be quite daunting. Besides actually writing the manuscript, the authors are faced with a number of technical requirements. Each journal has its own formatting requirements, relating not only to section headings and text layout, but also to very small details such as the placement of commas in reference lists. Then, if data are presented in the form of figures, they must be formatted so that they can be understood by the readership, and most journals still require that the data be in a format which can be read when printed in black-and-white. Most daunting (and important) of all, for the article to be scientifically valid it must be absolutely true in its representation of the work reported (i.e. all data must be shown unless a strong justification exists for removing data points), and this might cause angst in the minds of the authors when the results aren’t clear or possibly contradict the expected or desired result.

    Monte Carlo modeling of a 4 mm conical collimator for a Novalis Tx Linear Accelerator

    The work presented in this poster outlines the steps taken to model a 4 mm conical collimator (BrainLab, Germany) on a Novalis Tx linear accelerator (Varian, Palo Alto, USA) capable of producing a 6 MV photon beam for the treatment of stereotactic radiosurgery (SRS) patients. The verification of this model was performed by measurements in liquid water and in virtual water. The measurements involved scanning depth-dose curves and profiles in a water tank, plus measurement of output factors in virtual water using Gafchromic® EBT3 film.
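
    For reference, the output factor reported from such film measurements is simply the film-derived dose for the 4 mm cone divided by that for the machine's reference field under matched conditions. The minimal C++ sketch below illustrates this; the numerical values and function name are placeholders, and small-field detector correction factors are deliberately omitted.

    #include <iostream>

    // Output factor: dose for the 4 mm cone relative to the reference field,
    // both measured at the same depth and source-to-surface distance.
    // Small-field detector correction factors are not applied in this sketch.
    double outputFactor(double coneDose, double referenceFieldDose) {
        return coneDose / referenceFieldDose;
    }

    int main() {
        const double doseCone4mm = 0.62;  // Gy from EBT3 film under the 4 mm cone (placeholder)
        const double doseRef     = 1.00;  // Gy from EBT3 film for the reference field (placeholder)
        std::cout << "Output factor: " << outputFactor(doseCone4mm, doseRef) << '\n';
    }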

    Imaging and radiation interactions of polymer gel dosimeters

    Aim: The past two decades have seen a large body of work dedicated to the development of a three-dimensional gel dosimetry system for the recording of radiation dose distributions in radiation therapy. The purpose of much of the work to date has been to improve the methods by which the absorbed dose information is extracted. Current techniques include magnetic resonance imaging (MRI), optical tomography, Raman spectroscopy, x-ray computed tomography (CT) and ultrasound. This work examines CT imaging as a method of evaluating polymer gel dosimeters. Apart from publications resulting from this work, there have been only two other journal articles to date reporting results of CT gel dosimetry, which indicates that there is still much work required to develop the technique. Therefore, the aim of this document is to develop CT gel dosimetry to the extent that it is of use to clinical and research physicists.

    Scope: Each chapter in this document describes an aspect of CT gel dosimetry which was examined, with Chapters 2 to 7 containing brief technical backgrounds for each aspect. Chapter 1 contains a brief review of gel dosimetry. The first step in the development of any method for reading a signal is to determine whether the signal can actually be obtained. However, before polymer gel dosimeters can be imaged using a CT scanner, imaging techniques are required which can be employed to obtain reliable readings. Chapter 2 examines the various artifacts inherent in CT which interfere with the quantitative analysis of gel dosimeters, and a method for their removal is developed. The method for artifact reduction is based on a subtraction technique employed previously in a feasibility study, and a system is designed to greatly simplify the process. The simplification of the technique removes the requirement for accurate realignment of the phantom within the scanner and enables the imaging of calibration vials. Having established a method by which readings of polymer gel dosimeters can be obtained with CT, Chapter 3 examines the CT dose response. A number of formulations of polymer gel dosimeter are studied by varying the constituent chemicals and their concentrations. The results from this chapter can be employed to determine the concentration of chemicals when manufacturing a polymer gel dosimeter with a desired CT dose response. With the CT dose response characterised in Chapter 3, the macroscopic cause of the CT signal is examined in Chapter 4. To this end, direct measurement of the linear attenuation coefficient is obtained with a collimated radiation source and detector, and density is measured by Archimedes' principle. Comparison of the two results shows that the cause of the CT signal is a density change, and the implications for polymer gel dosimetry are discussed. The CT scanner is revisited in Chapter 5 to examine the CT imaging techniques required for optimal performance. The main limitation of the use of CT in gel dosimetry to date has been image noise. In Chapter 5 stochastic noise is investigated and reduced; the main source of non-stochastic noise in CT is found and imaging techniques are examined which can greatly reduce this residual noise. Predictions of computer simulations are verified experimentally. Although techniques for the reduction of noise are developed in Chapter 5, there may be situations where the noise must be further reduced. An image processing algorithm is designed in Chapter 6 which employs a combination of commonly available image filters. The algorithm and the filters are tested for their suitability in gel dosimetry through the use of a simulated dose distribution and by performing a pilot study on an irradiated polymer gel phantom. Having developed CT gel dosimetry to the point where a suitable image can be obtained, the final step is to investigate the uncertainty in the dose calibration. Methods used for calibration uncertainty in MRI gel dosimetry to date have either assumed a linear response up to a certain dose, or have removed the requirement for linearity but incorrectly ignored the reliability of the data and the fit of the calibration function. In Chapter 7 a method for the treatment of calibration data in CT gel dosimetry is proposed which allows for non-linearity of the calibration function, as well as the goodness of its fit to the data. Alternatively, it allows for reversion to MRI techniques if linearity is assumed in a limited dose range.

    Conclusion: The combination of the techniques developed in this project and the newly formulated normoxic gels (not extensively studied here) means that gel dosimetry is close to becoming viable for use in the clinic. The only capital purchase required for a typical clinic is a suitable water tank, which is easily and inexpensively producible if the clinic has access to a workshop.
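
    The artefact-subtraction and image-averaging steps described above can be illustrated with a short sketch. The C++ below is only an illustration, not the thesis code: images are assumed to be flattened arrays of CT numbers, repeated acquisitions of a slice are averaged to suppress stochastic noise, and an averaged pre-irradiation image is subtracted from an averaged post-irradiation image so that fixed scanner artefacts cancel and only the dose-induced change in CT number remains.

    #include <cstddef>
    #include <vector>

    // Illustrative sketch only: an image is a flattened slice of CT numbers (HU).
    using Image = std::vector<double>;

    // Average N repeated scans of the same slice to reduce stochastic noise
    // (noise falls roughly as 1/sqrt(N) for independent acquisitions).
    Image averageScans(const std::vector<Image>& scans) {
        Image mean(scans.front().size(), 0.0);
        for (const Image& s : scans)
            for (std::size_t i = 0; i < mean.size(); ++i)
                mean[i] += s[i] / scans.size();
        return mean;
    }

    // Subtract the averaged pre-irradiation image from the averaged
    // post-irradiation image; artefacts common to both acquisitions cancel,
    // leaving the dose-induced change in CT number.
    Image doseInducedChange(const Image& post, const Image& pre) {
        Image diff(post.size());
        for (std::size_t i = 0; i < post.size(); ++i)
            diff[i] = post[i] - pre[i];
        return diff;
    }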

    Fast tessellated solid navigation in GEANT4

    Navigation through tessellated solids in GEANT4 can degrade computational performance, especially if the tessellated solid is large and comprises many facets. Redefining a tessellated solid as a mesh of tetrahedra is common in other computational techniques, such as finite element analysis, as computations need only consider local tetrahedra rather than the tessellated solid as a whole. Herein we describe a technique that allows for automatic tetrahedral meshing of tessellated solids in GEANT4 and the subsequent loading of these meshes as assembly volumes; loading nested tessellated solids and tetrahedral meshes is also examined. As the technique makes the geometry suitable for automatic optimisation using smart voxels, navigation through a simple tessellated volume has been found to be more than two orders of magnitude faster than that through the equivalent tessellated solid. Speed increases of more than two orders of magnitude were also observed for a more complex tessellated solid with voids and concavities. The technique was benchmarked for geometry load time, simulation run time and memory usage. Source code enabling the described functionality in GEANT4 has been made freely available on the Internet.
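
    As a rough sketch of the approach (not the authors' released source code), the C++ fragment below assumes the tetrahedra have already been produced by an external mesher from the tessellated surface, wraps each one in a G4Tet solid, and imprints the whole set into the mother volume via a G4AssemblyVolume, after which GEANT4's smart-voxel optimisation applies to the individual tetrahedra.

    #include <array>
    #include <string>
    #include <vector>

    #include "G4AssemblyVolume.hh"
    #include "G4LogicalVolume.hh"
    #include "G4Material.hh"
    #include "G4Tet.hh"
    #include "G4ThreeVector.hh"

    // One tetrahedron = four vertices, assumed here to come from an external
    // tetrahedral mesher run over the original tessellated surface.
    using Tetrahedron = std::array<G4ThreeVector, 4>;

    // Wrap each tetrahedron in a G4Tet solid, collect the solids in an
    // assembly volume and imprint the assembly into the mother volume.
    // Once imprinted, the tetrahedra are ordinary daughter volumes and are
    // picked up by the smart-voxel optimisation of the mother volume.
    void PlaceTetrahedralMesh(const std::vector<Tetrahedron>& tets,
                              G4Material* material,
                              G4LogicalVolume* motherLogical)
    {
        auto* assembly = new G4AssemblyVolume();
        G4ThreeVector noTranslation;  // mesh vertices are already in mother coordinates

        int id = 0;
        for (const Tetrahedron& t : tets) {
            auto* solid   = new G4Tet("tet_" + std::to_string(id++), t[0], t[1], t[2], t[3]);
            auto* logical = new G4LogicalVolume(solid, material, solid->GetName());
            assembly->AddPlacedVolume(logical, noTranslation, nullptr);
        }

        assembly->MakeImprint(motherLogical, noTranslation, nullptr);
    }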

    Investigation of stereotactic radiotherapy dose using dosimetry film and Monte Carlo simulations

    This study uses dosimetry film measurements and Monte Carlo simulations to investigate the accuracy of type-a (pencil-beam) dose calculations for predicting the radiation doses delivered during stereotactic radiotherapy treatments of the brain. It is shown that when evaluating doses in a water phantom, the type-a algorithm provides dose predictions which are accurate to within the clinically relevant gamma(3%,3mm) criterion, but these predictions are nonetheless subtly different from the results of evaluating doses from the same fields using radiochromic film and Monte Carlo simulations. An analysis of a clinical meningioma treatment suggests that when predicting stereotactic radiotherapy doses to the brain, the inaccuracies of the type-a algorithm can be exacerbated by inadequate evaluation of the effects of nearby bone or air, resulting in dose differences of up to 10% for individual fields. The results of this study indicate the possible advantage of using Monte Carlo calculations, as well as measurements with high-spatial-resolution media, to verify type-a predictions of dose delivered in cranial treatments.
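
    For context, the gamma(3%,3mm) criterion quoted above combines a dose-difference test with a distance-to-agreement test. The sketch below is a minimal one-dimensional version of the standard gamma-index calculation under assumed inputs (dose profiles on a regular grid, global normalisation to the reference maximum); clinical evaluations use 2D or 3D data and more refined search strategies.

    #include <algorithm>
    #include <cmath>
    #include <cstddef>
    #include <limits>
    #include <vector>

    // 1D gamma index: for each evaluated point, search the reference points
    // for the minimum combined dose-difference / distance-to-agreement metric.
    // doseTol is a fraction of the reference maximum (e.g. 0.03); distTol is in mm.
    std::vector<double> gammaIndex1D(const std::vector<double>& refDose,
                                     const std::vector<double>& evalDose,
                                     double spacingMM,
                                     double doseTol, double distTol)
    {
        const double refMax = *std::max_element(refDose.begin(), refDose.end());
        const double dD = doseTol * refMax;  // global dose criterion in absolute dose units

        std::vector<double> gamma(evalDose.size());
        for (std::size_t i = 0; i < evalDose.size(); ++i) {
            double best = std::numeric_limits<double>::max();
            for (std::size_t j = 0; j < refDose.size(); ++j) {
                const double dist = (static_cast<double>(i) - static_cast<double>(j)) * spacingMM;
                const double diff = evalDose[i] - refDose[j];
                const double g2 = (dist * dist) / (distTol * distTol) +
                                  (diff * diff) / (dD * dD);
                best = std::min(best, g2);
            }
            gamma[i] = std::sqrt(best);  // gamma <= 1 means the point passes
        }
        return gamma;
    }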

    Big shoes to fill


    Citations equals research quality? If you agree then don’t cite this stupid, totally terrible article

    This article accompanies a more serious debate on the value of citations as a measure of research quality [1]. This article has a purpose described in the debate. This article is largely pointless (any points made are unintentional and purely accidental), it is of poor quality (e.g. the first 3 sentences start with ‘This article’), has too many keywords, and contains speeling misteaks, and, questionable, grammar (plus, the tense in the title may or may not be correct). In fact, the web pages of Wikipedia [2] and Youtube [3] are cited for no reason (and incorrectly). It is bad enough that the editor would never accept it into this journal if written by another person or for another purpose. It is, therefore, an editor’s rant. There is conjecture on whether the number of citations an article or journal receives determines the quality and impact. If the argument that citations equals quality is true, then this article should deservedly receive no citations and no further attention. However, if it gains citations or any other attention, does this disprove the notion? Ironically, should this article not gain citations, and thus support the argument, then it might potentially be cited as supporting evidence. But then the very act of proving the argument would then disprove the argument that it proves.

    APESM statistics and summary of 2017–2018
